
# Domain-Adaptive Pretraining

## RigoBERTa Clinical
Author: IIC · License: Other · Tags: Large Language Model, Transformers, Spanish

RigoBERTa Clinical is a state-of-the-art encoder language model for the Spanish clinical domain, developed through domain-adaptive pretraining on ClinText-SP, the largest publicly available Spanish clinical corpus.
## jobBERT-de
Author: agne · Tags: Large Language Model, Transformers, German

A domain-adaptive language model for German job advertisements, continuously pretrained from bert-base-german-cased.
## ConfliBERT-scr-cased
Author: snowood1 · License: GPL-3.0 · Tags: Large Language Model, Transformers

ConfliBERT is a language model pretrained specifically for text about political conflicts and violent events, released in four variants to suit different needs.
## BioRedditBERT-uncased
Author: cambridgeltl · Tags: Large Language Model, English

A BERT model initialized from BioBERT and further pretrained on health-related Reddit posts, specializing in medical text from social media.
## mengzi-bert-base-fin
Author: Langboat · License: Apache-2.0 · Tags: Large Language Model, Transformers, Chinese

Based on mengzi-bert-base and further trained on 20 GB of financial news and research-report data, focusing on natural language processing tasks in the Chinese financial domain.
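The models above share one recipe: take a general-purpose BERT-style encoder and continue masked-language-model pretraining on domain text (clinical notes, job ads, financial news). A minimal sketch of the core step, the BERT-style masking rule (of the ~15% of tokens selected, 80% become `[MASK]`, 10% a random token, 10% stay unchanged), in pure Python with hypothetical token ids (`MASK_ID = 103` follows the bert-base-uncased vocabulary; real ids depend on the tokenizer):

```python
import random

MASK_ID = 103      # [MASK] id in bert-base-uncased (assumption; varies by tokenizer)
VOCAB_SIZE = 30522 # bert-base-uncased vocabulary size (assumption)

def mask_tokens(token_ids, mlm_prob=0.15, rng=None):
    """Apply BERT-style MLM masking; returns (inputs, labels).

    labels holds the original token at masked positions and -100
    elsewhere, the conventional "ignore" index for the MLM loss.
    """
    if rng is None:
        rng = random.Random(0)
    inputs, labels = list(token_ids), []
    for i, tok in enumerate(token_ids):
        if rng.random() < mlm_prob:
            labels.append(tok)                      # model must predict the original
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID                 # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: keep the original token unchanged
        else:
            labels.append(-100)                     # not selected: excluded from loss
    return inputs, labels
```

In practice this is what `transformers`' data collator for language modeling does batch-wise on tensors; feeding such (inputs, labels) pairs of in-domain text to a pretrained encoder and training for further steps is the domain-adaptive pretraining these model cards describe.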
© 2025 AIbase